Efficient Perceptron Learning Using Constrained Steepest Descent
Running Title: Efficient Perceptron Learning
Authors
Abstract
An algorithm is proposed for training the single-layered perceptron. The algorithm follows successive steepest descent directions with respect to the perceptron cost function, taking care not to increase the number of misclassified patterns. The problem of finding these directions is stated as a quadratic programming task, to which a fast and effective solution is proposed. The resulting algorithm has no free parameters and therefore no heuristics are involved in its application. It is proved that the algorithm always converges in a finite number of steps. For linearly separable problems, it always finds a hyperplane that completely separates patterns belonging to different categories. Termination of the algorithm without separating all given patterns means that the presented set of patterns is indeed linearly inseparable. Thus the algorithm provides a natural criterion for linear separability. Compared to other state-of-the-art algorithms, the proposed method exhibits substantially improved speed, as demonstrated in a number of demanding benchmark classification tasks.
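The quadratic-programming step itself is not reproduced in this abstract. The following Python is a minimal sketch of the overall scheme, assuming labels y_i in {-1, +1} and the standard batch perceptron cost; the paper's QP direction-finding is replaced here by a simple backtracking line search that enforces the same invariant (the number of misclassified patterns never increases). All function names are illustrative, not from the paper.

```python
import numpy as np

def perceptron_cost(w, X, y):
    """Perceptron cost: negative sum of margins over misclassified patterns."""
    margins = y * (X @ w)
    return -margins[margins <= 0].sum()

def n_misclassified(w, X, y):
    """Count patterns with non-positive margin."""
    return int(np.sum(y * (X @ w) <= 0))

def train(X, y, max_iter=1000):
    """Steepest descent on the perceptron cost, never letting the number of
    misclassified patterns increase.  The original algorithm finds admissible
    descent directions by solving a quadratic program; the backtracking line
    search below is a simplified stand-in for that step."""
    w = np.zeros(X.shape[1])
    for _ in range(max_iter):
        margins = y * (X @ w)
        mis = margins <= 0
        if not mis.any():
            return w, True                       # all patterns separated
        grad = -(y[mis, None] * X[mis]).sum(axis=0)  # gradient of the cost
        step, errors = 1.0, n_misclassified(w, X, y)
        while step > 1e-8:
            w_new = w - step * grad
            if n_misclassified(w_new, X, y) <= errors:
                w = w_new
                break
            step *= 0.5          # shrink until misclassifications don't grow
        else:
            return w, False                      # no admissible step found
    return w, n_misclassified(w, X, y) == 0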
Similar Resources
On the convergence speed of artificial neural networks in the solving of linear systems
Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism, and generalization. This paper scrutinizes how diverse learning methods affect the speed of convergence in neural networks. To this end, we first introduce a perceptron method based on artificial neural networks which has been applied to solving a non-singula...
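The abstract above is truncated, so the cited perceptron method's details are not available here. As a self-contained illustration, assuming only that solving a non-singular system Ax = b can be cast as training a single linear layer, here is a gradient-descent sketch on the cost ½‖Ax − b‖²; the function name and step-size choice are ours, not from the paper.

```python
import numpy as np

def solve_linear_gd(A, b, lr=None, tol=1e-10, max_iter=10000):
    """Solve Ax = b for non-singular A by gradient descent on 0.5*||Ax - b||^2,
    the cost minimised by a single linear neuron layer."""
    x = np.zeros(A.shape[1])
    if lr is None:
        lr = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step: 1 / lambda_max(A^T A)
    for _ in range(max_iter):
        r = A @ x - b                          # residual
        if np.linalg.norm(r) < tol:
            break
        x -= lr * (A.T @ r)                    # gradient of the quadratic cost
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(solve_linear_gd(A, b))                   # ~ [0.0909, 0.6364]
```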
LETTER (Communicated by Shun-ichi Amari): On "Natural" Learning and Pruning in Multilayered Perceptrons
Several studies have shown that natural gradient descent for on-line learning is much more efficient than standard gradient descent. In this paper, we derive natural gradients in a slightly different manner and discuss implications for batch-mode learning and pruning, linking them to existing algorithms such as Levenberg-Marquardt optimization and optimal brain surgeon. The Fisher matrix plays an ...
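For reference, the standard and natural gradient updates that the letter contrasts can be written as follows, where E(w) is the cost, η the learning rate, and F(w) the Fisher information matrix:

```latex
% Standard versus natural gradient updates for weights w.
\[
  \Delta w_{\mathrm{std}} = -\eta \,\nabla E(w),
  \qquad
  \Delta w_{\mathrm{nat}} = -\eta \, F^{-1}(w)\,\nabla E(w),
\]
\[
  F(w) = \mathbb{E}\!\left[ \nabla_w \log p(x;w)\, \nabla_w \log p(x;w)^{\top} \right].
\]
```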
NIPS*97: The Efficiency and the Robustness of Natural Gradient Descent Learning Rule (Sub-category: Dynamics of Learning Algorithms; Category: Theory)
We have discovered a new scheme to represent the Fisher information matrix of a stochastic multi-layer perceptron. Based on this scheme, we have designed an algorithm to compute the inverse of the Fisher information matrix. When the input dimension n is much larger than the number of hidden neurons, the complexity of this algorithm is of order O(n^2) while the complexity of conventional algorit...
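The truncated abstract does not reveal the scheme itself. One standard route to such savings, offered here purely as an illustrative assumption rather than the paper's method, is the Woodbury identity: if the Fisher matrix has diagonal-plus-low-rank structure F = D + UCUᵀ with U of size n × m and m ≪ n, its inverse costs about O(n²m) instead of the O(n³) of direct inversion.

```python
import numpy as np

def woodbury_inverse(d, U, C):
    """Inverse of F = diag(d) + U C U^T via the Woodbury identity.
    For U of shape (n, m) with m << n, the cost is O(n^2 * m) rather
    than the O(n^3) of forming and inverting F directly.  This is an
    illustrative assumption, not the scheme from the cited letter."""
    Dinv = 1.0 / d
    K = np.linalg.inv(np.linalg.inv(C) + (U.T * Dinv) @ U)   # m x m solve
    return np.diag(Dinv) - (Dinv[:, None] * U) @ K @ (U.T * Dinv)

rng = np.random.default_rng(0)
n, m = 200, 5
d = rng.uniform(1.0, 2.0, n)
U = rng.standard_normal((n, m))
C = np.eye(m)
F = np.diag(d) + U @ C @ U.T
print(np.allclose(woodbury_inverse(d, U, C), np.linalg.inv(F)))  # True
```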
A New Threshold Unit Learning Algorithm
A new algorithm for learning a threshold unit is proposed. The Barycentric Correction Procedure (BCP) is an efficient substitute for the Perceptron and its enhanced versions such as the Thermal Perceptron or the Pocket algorithm. Based on geometrical concepts, the BCP is much more efficient than the Perceptron for learning linearly separable mappings. To deal with linearly nonseparable mappings, ex...
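The BCP itself is geometric and its details are truncated above. For orientation, here is a sketch of the classical Pocket algorithm (Gallant) that the BCP is compared against, in a common error-counting variant; this is not the BCP.

```python
import numpy as np

def pocket(X, y, steps=1000, seed=0):
    """Pocket algorithm: run ordinary perceptron updates, but keep in the
    'pocket' the best weight vector seen so far, measured here by the number
    of training errors.  Works even when the data are not linearly separable,
    which is the regime the BCP's extensions also target."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.zeros(X.shape[1])
    best_w, best_err = w.copy(), n + 1
    for _ in range(steps):
        i = rng.integers(n)                 # visit patterns in random order
        if y[i] * (X[i] @ w) <= 0:          # misclassified -> perceptron update
            w = w + y[i] * X[i]
            err = int(np.sum(y * (X @ w) <= 0))
            if err < best_err:              # better than the pocket -> swap in
                best_w, best_err = w.copy(), err
    return best_w, best_err
```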
Journal title:
Volume, Issue:
Pages: -
Publication date: 2007